Storing cycles in Hopfield-type networks with pseudoinverse learning rule: Admissibility and network topology

Authors

  • Chuan Zhang
  • Gerhard Dangelmayr
  • Iuliana Oprea
Abstract

Cyclic patterns of neuronal activity are ubiquitous in animal nervous systems, and partially responsible for generating and controlling rhythmic movements such as locomotion, respiration, swallowing and so on. Clarifying the role of the network connectivities for generating cyclic patterns is fundamental for understanding the generation of rhythmic movements. In this paper, the storage of binary cycles in Hopfield-type and other neural networks is investigated. We call a cycle defined by a binary matrix Σ admissible if a connectivity matrix satisfying the cycle's transition conditions exists, and if so construct it using the pseudoinverse learning rule. Our main focus is on the structural features of admissible cycles and the topology of the corresponding networks. We show that Σ is admissible if and only if its discrete Fourier transform contains exactly r=rank(Σ) nonzero columns. Based on the decomposition of the rows of Σ into disjoint subsets corresponding to loops, where a loop is defined by the set of all cyclic permutations of a row, cycles are classified as simple cycles, and separable or inseparable composite cycles. Simple cycles contain rows from one loop only, and the network topology is a feedforward chain with feedback to one neuron if the loop-vectors in Σ are cyclic permutations of each other. For special cases this topology simplifies to a ring with only one feedback. Composite cycles contain rows from at least two disjoint loops, and the neurons corresponding to the loop-vectors in Σ from the same loop are identified with a cluster. Networks constructed from separable composite cycles decompose into completely isolated clusters. For inseparable composite cycles at least two clusters are connected, and the cluster-connectivity is related to the intersections of the spaces spanned by the loop-vectors of the clusters. Simulations showing successfully retrieved cycles in continuous-time Hopfield-type networks and in networks of spiking neurons exhibiting up-down states are presented.
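For readers who want to experiment with the construction sketched in the abstract, the short NumPy example below builds a connectivity matrix with the pseudoinverse learning rule and applies the stated DFT admissibility test. The matrix conventions (rows of Σ as neuron loop-vectors, columns as cycle states), the function names, and the example loop-vector are assumptions made for illustration; this is a minimal sketch, not the authors' code.

    import numpy as np

    # Sketch (illustrative assumptions, not the paper's implementation).
    # Rows of Sigma are neurons (loop-vectors); columns are the successive
    # +/-1 states of the cycle, with column k mapped to column k+1 (mod p).

    def pseudoinverse_weights(Sigma):
        """Connectivity matrix J = Sigma' Sigma^+, where Sigma' is the
        one-step column-shifted cycle (pseudoinverse learning rule)."""
        Sigma_next = np.roll(Sigma, -1, axis=1)      # Sigma_next[:, k] = state k+1
        return Sigma_next @ np.linalg.pinv(Sigma)

    def is_admissible(Sigma, tol=1e-10):
        """Admissibility criterion stated in the abstract: the DFT of Sigma,
        taken along the cycle index, has exactly rank(Sigma) nonzero columns."""
        F = np.fft.fft(Sigma, axis=1)                # row-wise DFT over the period
        nonzero_cols = int(np.sum(np.linalg.norm(F, axis=0) > tol))
        return nonzero_cols == np.linalg.matrix_rank(Sigma)

    # Hypothetical example: all rows are cyclic permutations of one loop-vector.
    row = np.array([1, 1, -1, -1])
    Sigma = np.stack([np.roll(row, k) for k in range(4)])   # 4 neurons, period 4

    if is_admissible(Sigma):
        J = pseudoinverse_weights(Sigma)
        # Transition conditions: sign(J @ state_k) should reproduce state_{k+1}.
        ok = np.array_equal(np.sign(J @ Sigma), np.roll(Sigma, -1, axis=1))
        print("cycle stored:", ok)

In this example the rows of Σ all belong to a single loop, so the stored cycle is a simple cycle in the paper's classification; a composite cycle would stack rows from two or more disjoint loops.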

Related references

Storing Heteroclinic Cycles in Hopfield-type Neural Networks

This report demonstrates how to use the pseudoinverse learning rule to store patterns and pattern sequences in a Hopfield-type neural network, and briefly discusses the effects of two parameters on the network dynamics.


Storing static and cyclic patterns in an Hopfield neural network

When viewing recurrent neural networks as brain-like machines for storing and retrieving information, it is fundamental to explore these storing, indexing and retrieving capacities as fully as possible. This paper reviews an efficient Hebbian learning rule used to store both static and cyclic patterns in the dynamical attractors of a Hopfield neural network. A key improvement will be presented which consis...


Learning Cycles brings Chaos in Continuous Hopfield Networks

This paper studies the impact of a Hebbian learning algorithm on the recurrent neural network's underlying dynamics. Two different kinds of learning are compared as ways of encoding information in the attractors of the Hopfield neural net: the storing of static patterns and the storing of cyclic patterns. We show that if the storing of static patterns leads to a reduction of the potent...


Tolerance of Pattern Storage Network for Storage and Recalling of Compressed Image using SOM

In this paper we study the tolerance of a Hopfield neural network for the storage and recall of fingerprint images. Feature extraction for these images is performed with FFT, DWT and SOM. The resulting feature vectors are stored as associative memories in a Hopfield neural network with the Hebbian and pseudoinverse learning rules. The objective of this study is to determine the optimal weight m...


Correlated sequence learning in a network of spiking neurons using maximum likelihood

Hopfield Networks are an idealised model of distributed computation in networks of non-linear, stochastic units. We consider the learning of correlated temporal sequences using Maximum Likelihood, deriving a simple Hebbian-like learning rule that is capable of robustly storing multiple sequences of correlated patterns. We argue that the learning rule is optimal for the case of long temporal seq...



Journal:
  • Neural Networks: the official journal of the International Neural Network Society

Volume 46, Issue 

Pages  -

Publication date: 2013